Hessenberg Matrix

In linear algebra, a Hessenberg matrix is a special kind of square matrix, one that is "almost" triangular. To be exact, an upper Hessenberg matrix has zero entries below the first subdiagonal, and a lower Hessenberg matrix has zero entries above the first superdiagonal. They are named after Karl Hessenberg (1904–1959), a German mathematician and engineer.


Definitions


Upper Hessenberg matrix

A square n \times n matrix A is said to be in upper Hessenberg form, or to be an upper Hessenberg matrix, if a_{ij}=0 for all i,j with i > j+1. An upper Hessenberg matrix is called unreduced if all subdiagonal entries are nonzero, i.e. if a_{i+1,i} \neq 0 for all i \in \{1,\ldots,n-1\}.


Lower Hessenberg matrix

A square n \times n matrix A is said to be in lower Hessenberg form, or to be a lower Hessenberg matrix, if its transpose is an upper Hessenberg matrix, or equivalently if a_{ij}=0 for all i,j with j > i+1. A lower Hessenberg matrix is called unreduced if all superdiagonal entries are nonzero, i.e. if a_{i,i+1} \neq 0 for all i \in \{1,\ldots,n-1\}.


Examples

Consider the following matrices.
:A=\begin{bmatrix} 1 & 4 & 2 & 3 \\ 3 & 4 & 1 & 7 \\ 0 & 2 & 3 & 4 \\ 0 & 0 & 1 & 3 \end{bmatrix}
:B=\begin{bmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 3 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{bmatrix}
:C=\begin{bmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 0 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{bmatrix}
The matrix A is an unreduced upper Hessenberg matrix, B is an unreduced lower Hessenberg matrix, and C is a lower Hessenberg matrix that is not unreduced.
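The defining zero patterns are straightforward to check programmatically. The following Python sketch (the function names are illustrative, not from any particular library) tests a matrix, given as a list of rows, for the upper Hessenberg and unreduced properties; a lower Hessenberg matrix can be detected by testing its transpose.

```python
def is_upper_hessenberg(M):
    """True if every entry below the first subdiagonal is zero."""
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i > j + 1)

def is_unreduced_upper_hessenberg(M):
    """True if M is upper Hessenberg with all subdiagonal entries nonzero."""
    n = len(M)
    return is_upper_hessenberg(M) and all(M[i + 1][i] != 0 for i in range(n - 1))

def transpose(M):
    return [list(row) for row in zip(*M)]

# The matrices A and C from the example above.
A = [[1, 4, 2, 3], [3, 4, 1, 7], [0, 2, 3, 4], [0, 0, 1, 3]]
C = [[1, 2, 0, 0], [5, 2, 0, 0], [3, 4, 3, 7], [5, 6, 1, 1]]

print(is_unreduced_upper_hessenberg(A))             # A is unreduced upper Hessenberg
print(is_upper_hessenberg(transpose(C)))            # C is lower Hessenberg ...
print(is_unreduced_upper_hessenberg(transpose(C)))  # ... but not unreduced
```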


Computer programming

Many linear algebra algorithms require significantly less computational effort when applied to triangular matrices, and this improvement often carries over to Hessenberg matrices as well. If the constraints of a linear algebra problem do not allow a general matrix to be conveniently reduced to a triangular one, reduction to Hessenberg form is often the next best thing. In fact, reduction of any matrix to Hessenberg form can be achieved in a finite number of steps (for example, through Householder transformations, which are unitary similarity transforms). Subsequent reduction of a Hessenberg matrix to a triangular matrix can then be achieved through iterative procedures, such as shifted QR factorization combined with deflation steps. Reducing a general matrix to a Hessenberg matrix first, and then reducing the Hessenberg matrix to a triangular one, instead of directly reducing a general matrix to a triangular matrix, often economizes the arithmetic involved in the QR algorithm for eigenvalue problems.
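The economy comes from the fact that a single QR step applied to an upper Hessenberg matrix costs only O(n^2) operations instead of the O(n^3) needed for a general matrix, because each subdiagonal entry can be annihilated by one Givens rotation, and the result H_{new} = RQ = Q^*HQ is again upper Hessenberg. The following pure-Python sketch of one unshifted QR step illustrates this (the function name is illustrative; a production code would also use shifts and deflation):

```python
import math

def qr_step_hessenberg(H):
    """One unshifted QR iteration H -> RQ on an upper Hessenberg matrix,
    using one Givens rotation per subdiagonal entry (O(n^2) work).
    The result is similar to H and again upper Hessenberg."""
    n = len(H)
    R = [row[:] for row in H]
    rotations = []
    # Factor: apply rotations on the left so that R becomes upper triangular.
    for k in range(n - 1):
        a, b = R[k][k], R[k + 1][k]
        r = math.hypot(a, b)
        if r == 0.0:
            rotations.append((1.0, 0.0))  # identity rotation
            continue
        c, s = a / r, b / r
        rotations.append((c, s))
        for j in range(k, n):  # columns left of k are already zero in rows k, k+1
            t1, t2 = R[k][j], R[k + 1][j]
            R[k][j] = c * t1 + s * t2
            R[k + 1][j] = -s * t1 + c * t2
    # Form RQ: apply the transposed rotations on the right, in the same order.
    for k, (c, s) in enumerate(rotations):
        for i in range(min(k + 2, n)):  # rows below k+1 are zero in columns k, k+1
            t1, t2 = R[i][k], R[i][k + 1]
            R[i][k] = c * t1 + s * t2
            R[i][k + 1] = -s * t1 + c * t2
    return R
```

Applying this step to the upper Hessenberg matrix A from the examples preserves the trace (a similarity invariant) and the Hessenberg zero pattern.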


Reduction to Hessenberg matrix

Any n \times n matrix can be transformed into a Hessenberg matrix by a similarity transformation using Householder transformations. The following procedure for such a transformation is adapted from ''A Second Course in Linear Algebra'' by Garcia & Horn. Let A be any real or complex n \times n matrix, let A^\prime be the (n-1) \times n submatrix of A constructed by removing the first row of A, and let \mathbf{a}^\prime_1 be the first column of A^\prime. Construct the (n-1) \times (n-1) Householder matrix
:V_1 = I_{n-1} - 2\frac{ww^*}{\|w\|^2}, \quad \text{where} \quad w = \begin{cases} \|\mathbf{a}^\prime_1\|_2\mathbf{e}_1 - \mathbf{a}^\prime_1 & , \; a^\prime_{11} = 0 \\ \|\mathbf{a}^\prime_1\|_2\mathbf{e}_1 + \frac{a^\prime_{11}}{|a^\prime_{11}|}\mathbf{a}^\prime_1 & , \; a^\prime_{11} \neq 0 \end{cases}
This Householder matrix will map \mathbf{a}^\prime_1 to \|\mathbf{a}^\prime_1\|\mathbf{e}_1, and as such, the block matrix
:U_1 = \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & V_1 \end{bmatrix}
will map the matrix A to the matrix U_1A, which has only zeros below the second entry of the first column. Now construct the (n-2) \times (n-2) Householder matrix V_2 in a similar manner as V_1, such that V_2 maps the first column of A^{\prime\prime} to \|\mathbf{a}^{\prime\prime}_1\|\mathbf{e}_1, where A^{\prime\prime} is the submatrix of A^\prime constructed by removing the first row and the first column of A^\prime, and then let
:U_2 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & V_2 \end{bmatrix}
which maps U_1A to the matrix U_2U_1A, which has only zeros below the first and second entries of the subdiagonal. Now construct V_3 and then U_3 in a similar manner, but for the matrix A^{\prime\prime\prime} constructed by removing the first row and first column of A^{\prime\prime}, and proceed as in the previous steps. Continue like this for a total of n-2 steps. By construction of U_k, the first k rows of any n \times n matrix are invariant under multiplication by U_k^* from the right. Hence, any matrix can be transformed to an upper Hessenberg matrix by a similarity transformation of the form
:U_{n-2}( \dots (U_2(U_1AU_1^*)U_2^*) \dots )U_{n-2}^* = U_{n-2} \dots U_2U_1A(U_{n-2} \dots U_2U_1)^* = UAU^*.
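The procedure above can be sketched in pure Python for real matrices. This is a minimal illustration, not a production implementation (the function name is made up, and the sign convention for the reflector is chosen to avoid cancellation rather than to match the piecewise formula above exactly); each step applies a Householder reflector from the left and, for similarity, from the right.

```python
import math

def householder_hessenberg(A):
    """Reduce a real square matrix to upper Hessenberg form by
    Householder similarity transformations. Returns a new matrix
    similar to A with zeros below the first subdiagonal."""
    n = len(A)
    H = [row[:] for row in A]  # work on a copy
    for k in range(n - 2):
        # Entries of column k below the subdiagonal position (k+1, k).
        x = [H[i][k] for i in range(k + 1, n)]
        norm_x = math.sqrt(sum(v * v for v in x))
        if norm_x == 0.0:
            continue  # column is already zero below the diagonal
        # Reflector v maps x to alpha*e1; sign chosen to avoid cancellation.
        alpha = -norm_x if x[0] >= 0 else norm_x
        v = x[:]
        v[0] -= alpha
        norm_v = math.sqrt(sum(t * t for t in v))
        if norm_v == 0.0:
            continue
        v = [t / norm_v for t in v]
        m = len(v)
        # Left multiplication: H <- P H, with P = I - 2 v v^T on rows k+1..n-1.
        for j in range(n):
            s = 2.0 * sum(v[i] * H[k + 1 + i][j] for i in range(m))
            for i in range(m):
                H[k + 1 + i][j] -= s * v[i]
        # Right multiplication: H <- H P on columns k+1..n-1 (similarity).
        for i in range(n):
            s = 2.0 * sum(H[i][k + 1 + j] * v[j] for j in range(m))
            for j in range(m):
                H[i][k + 1 + j] -= s * v[j]
    return H
```

Since each step is a unitary (here, real orthogonal) similarity, the result has the same eigenvalues as the input; in particular the trace is preserved, which gives a quick sanity check.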


Properties

For n \in \{1, 2\}, it is vacuously true that every n \times n matrix is both upper Hessenberg and lower Hessenberg. The product of a Hessenberg matrix with a triangular matrix is again Hessenberg. More precisely, if A is upper Hessenberg and T is upper triangular, then AT and TA are upper Hessenberg. A matrix that is both upper Hessenberg and lower Hessenberg is a tridiagonal matrix, of which symmetric and Hermitian Hessenberg matrices are important examples. A Hermitian matrix can be reduced to a real symmetric tridiagonal matrix.
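The product property can be checked numerically. A minimal Python sketch (the helper names are illustrative), using the upper Hessenberg matrix A from the examples and an arbitrary upper triangular T:

```python
def matmul(X, Y):
    """Naive product of two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_upper_hessenberg(M):
    """True if every entry below the first subdiagonal is zero."""
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i > j + 1)

# A is upper Hessenberg, T is upper triangular.
A = [[1, 4, 2, 3], [3, 4, 1, 7], [0, 2, 3, 4], [0, 0, 1, 3]]
T = [[2, 1, 0, 5], [0, 3, 1, 2], [0, 0, 1, 4], [0, 0, 0, 2]]

print(is_upper_hessenberg(matmul(A, T)))  # AT is upper Hessenberg
print(is_upper_hessenberg(matmul(T, A)))  # TA is upper Hessenberg
```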


Hessenberg operator

The Hessenberg operator is an infinite dimensional Hessenberg matrix. It commonly occurs as the generalization of the Jacobi operator to a system of orthogonal polynomials for the space of square-integrable holomorphic functions over some domain, that is, a Bergman space. In this case, the Hessenberg operator is the right-shift operator S, given by
:[Sf](z) = zf(z).
The eigenvalues of each principal submatrix of the Hessenberg operator are given by the characteristic polynomial of that submatrix. These polynomials are called the Bergman polynomials, and they provide an orthogonal polynomial basis for the Bergman space.


See also

* Hessenberg variety



External links


* Hessenberg matrix at MathWorld.
* High performance algorithms for reduction to condensed (Hessenberg, tridiagonal, bidiagonal) form